
    Investing under model uncertainty: decision-based evaluation of exchange rate forecasts in the US, UK and Japan

    We evaluate the forecast performance of a range of theory-based and atheoretical models explaining exchange rates in the US, UK and Japan. A decision-making environment is fully described for an investor who optimally allocates portfolio shares to domestic and foreign assets. Methods necessary to compute and use forecasts in this context are proposed, including the means of combining density forecasts to deal with model uncertainty. An out-of-sample forecast evaluation exercise is described using both statistical criteria and decision-based criteria. The theory-based models are found to perform relatively well when their forecasts are judged by their economic value.
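    The decision-based evaluation step can be sketched as an investor choosing a portfolio share that maximises expected utility over draws from a predictive density of returns. The sketch below is an illustration only: the function name, the CRRA utility, the grid search and all numerical values are assumptions, not the paper's exact setup.

```python
import numpy as np

def optimal_share(return_draws, rf=0.01, gamma=5.0, grid=np.linspace(0.0, 1.0, 101)):
    """Pick the foreign-asset share maximising expected CRRA utility over
    draws from a predictive density of the excess foreign return (sketch)."""
    best_w, best_u = 0.0, -np.inf
    for w in grid:
        wealth = 1.0 + rf + w * return_draws            # gross portfolio return per draw
        u = np.mean(wealth ** (1.0 - gamma) / (1.0 - gamma))  # expected CRRA utility
        if u > best_u:
            best_w, best_u = w, u
    return best_w

# illustrative draws from a combined (ensemble) predictive density
rng = np.random.default_rng(0)
draws = rng.normal(0.02, 0.1, 5000)
w_star = optimal_share(draws)
```

    The economic value of a forecast can then be measured by comparing realised utility under each model's implied allocation, which is the decision-based criterion the abstract refers to.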

    Measuring output gap uncertainty

    We propose a methodology for producing density forecasts for the output gap in real time using a large number of vector autoregressions in inflation and output gap measures. Density combination utilizes a linear mixture of experts framework to produce potentially non-Gaussian ensemble densities for the unobserved output gap. In our application, we show that data revisions alter substantially our probabilistic assessments of the output gap using a variety of output gap measures derived from univariate detrending filters. The resulting ensemble produces well-calibrated forecast densities for US inflation in real time, in contrast to those from simple univariate autoregressions which ignore the contribution of the output gap. Combining evidence from both linear trends and more flexible univariate detrending filters induces strong multi-modality in the predictive densities for the unobserved output gap. The peaks associated with these two detrending methodologies indicate output gaps of opposite sign for some observations, reflecting the pervasive nature of model uncertainty in our US data.
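    The linear mixture ("opinion pool") used to combine component densities can be sketched as below. The component means, spreads and weights are illustrative stand-ins for the detrending-filter VARs, chosen only to show how disagreement between two detrending approaches yields a bimodal ensemble density.

```python
import numpy as np

def gaussian_pdf(x, mean, sd):
    """Normal density, standing in for one model's forecast density."""
    return np.exp(-0.5 * ((x - mean) / sd) ** 2) / (sd * np.sqrt(2.0 * np.pi))

def opinion_pool(x, means, sds, weights):
    """Linear opinion pool: a convex combination of component densities."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return sum(wi * gaussian_pdf(x, m, s) for wi, m, s in zip(w, means, sds))

# two detrending approaches implying output gaps of opposite sign (illustrative)
x = np.linspace(-6.0, 6.0, 1201)
ensemble = opinion_pool(x, means=[-2.0, 2.0], sds=[0.7, 0.7], weights=[0.5, 0.5])
```

    Even though each component is Gaussian, the pool is not: the two peaks of opposite sign mirror the multi-modality the abstract describes.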

    Measuring the natural output gap using actual and expected output data

    An output gap measure is suggested based on the Beveridge-Nelson decomposition of output using a vector-autoregressive model that includes data on actual output and on expected output obtained from surveys. The paper explains the advantages of using survey data in business cycle analysis, and the gap is given economic meaning by relating it to the natural level of output defined in Dynamic Stochastic General Equilibrium models. The measure is applied to quarterly US data over the period 1970q1-2007q4 and the resultant gap estimates are shown to have sensible statistical properties and perform well in explaining inflation in estimates of New Keynesian Phillips curves.
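    The Beveridge-Nelson idea is that the trend is the level the series would settle at once forecastable growth dynamics die out. A univariate AR(1)-growth sketch conveys the mechanics; the paper itself works with a VAR that includes survey expectations, which this illustration does not reproduce.

```python
import numpy as np

def bn_decompose_ar1(y, phi, mu):
    """Beveridge-Nelson decomposition when log-output growth follows
    dy_t = mu + phi * (dy_{t-1} - mu) + e_t.
    The trend adds all expected future growth in excess of mu to the
    current level; for an AR(1) this sum is (phi/(1-phi)) * (dy_t - mu)."""
    y = np.asarray(y, dtype=float)
    dy = np.diff(y)
    trend = y[1:] + (phi / (1.0 - phi)) * (dy - mu)
    cycle = y[1:] - trend   # the (negative of the) forecastable component
    return trend, cycle

# constant growth at the mean rate implies a zero gap at every date
trend, cycle = bn_decompose_ar1(np.arange(10) * 0.5, phi=0.4, mu=0.5)
```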

    Measuring the Natural Output Gap Using Actual and Expected Output Data

    An output gap measure is suggested based on a multivariate Beveridge-Nelson decomposition of output using a vector-autoregressive model that includes data on actual output and on expected output obtained from surveys. The gap is estimated using an integrated approach to identifying permanent and different types of transitory shocks to output. The gap has a statistical basis but is given economic meaning by relating it to natural output in DSGE models. The approach is applied to quarterly US data over 1970q1-2007q4. Estimated gaps have sensible statistical properties and perform well in explaining inflation in estimates of New Keynesian Phillips curves.
    Keywords: Trend Output, Natural Output Level, Output Gap, Beveridge-Nelson Decomposition, Survey-based Expectations, New Keynesian Phillips Curve.

    Decision Making in Hard Times: What is a Recession, Why Do We Care and How Do We Know When We Are in One?

    Defining a recessionary event as one which impacts adversely on individuals’ economic well-being, the paper argues that recession is a multi-faceted phenomenon whose meaning differs from person to person as it impacts on their decision-making in real time. It argues that recession is best represented through the calculation of nowcast probabilities of recession events. A variety of such probabilities are produced using a real-time data set for the US over the period 1986q1-2008q4, focusing on the likelihood of various recessionary events within the sample and on prospects beyond its end.
    Keywords: Recession; Probability Forecasts; Real Time.
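    One natural way to compute such an event probability is to count the share of simulated paths from a predictive density that satisfy the event. The sketch below assumes the event is "two consecutive quarters of negative growth"; the paper considers several recession definitions, so this is one illustrative choice, not the paper's method.

```python
import numpy as np

def recession_probability(growth_paths):
    """Nowcast of the event 'two consecutive quarters of negative growth':
    the share of simulated growth paths (rows = draws from the predictive
    density, columns = quarters) in which any two adjacent quarters are
    both negative."""
    paths = np.asarray(growth_paths, dtype=float)
    hit = (paths[:, :-1] < 0) & (paths[:, 1:] < 0)   # adjacent-pair check
    return float(np.mean(hit.any(axis=1)))
```

    With paths drawn from a real-time predictive density, this delivers exactly the kind of event probability the abstract advocates reporting.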

    Real-time representations of the output gap

    Methods are described for the appropriate use of data obtained and analysed in real time to represent the output gap. The methods employ cointegrating VAR techniques to model real-time measures and realizations of output series jointly. The model is used to mitigate the impact of data revisions; to generate appropriate forecasts that can deliver economically meaningful output trends and that can take into account the end-of-sample problems encountered in measuring these trends; and to calculate probability forecasts that convey in a clear way the uncertainties associated with the gap measures. The methods are applied to data for the United States 1965q4–2004q4, and the improvements over standard methods are illustrated.

    Real-time inflation forecast densities from ensemble Phillips curves

    A popular macroeconomic forecasting strategy takes combinations across many models to hedge against model instabilities of unknown timing; see (among others) Stock and Watson (2004) and Clark and McCracken (2009). In this paper, we examine the effectiveness of recursive-weight and equal-weight combination strategies for density forecasting using a time-varying Phillips curve relationship between inflation and the output gap. The densities reflect the uncertainty across a large number of models using many statistical measures of the output gap, allowing for a single structural break of unknown timing. We use real-time data for the US, Australia, New Zealand and Norway. Our main finding is that the recursive-weight strategy performs well across the real-time data sets, consistently giving well-calibrated forecast densities. The equal-weight strategy generates poorly-calibrated forecast densities for the US and Australian samples. There is little difference between the two strategies for our New Zealand and Norwegian data. We also find that the ensemble modeling approach performs more consistently with real-time data than with revised data in all four countries.
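    A common implementation of a recursive-weight strategy, used here as an illustrative assumption rather than the paper's exact rule, sets each model's weight proportional to the exponentiated sum of its past log predictive scores, collapsing to equal weights when no model has outperformed.

```python
import numpy as np

def recursive_weights(log_scores):
    """Recursive-weight density combination (sketch): log_scores has shape
    (periods, models); each model's weight is proportional to exp of its
    cumulative log predictive score, so weights sum to one and favour
    models that have forecast well so far."""
    cum = np.sum(np.asarray(log_scores, dtype=float), axis=0)
    w = np.exp(cum - cum.max())   # subtract max for numerical stability
    return w / w.sum()
```

    The equal-weight benchmark is simply `np.full(n_models, 1.0 / n_models)`; the abstract's comparison is between calibration of densities pooled under these two weighting rules.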

    Some Considerations of the Ultimate Spatial Resolution Achievable in Scanning Transmission Electron Microscopy

    The fundamental limitations on spatial resolution of X-ray microanalysis in the scanning transmission electron microscope are set by the interrelationships between the gun brightness, operating voltage, probe convergence angle, size and current, specimen thickness, beam broadening, the probability of characteristic and Bremsstrahlung X-ray production and the statistics of the X-ray spectrum. Manipulation of expressions describing these interrelationships leads to equations predicting the optimum probe size and specimen thickness for the best achievable spatial resolution (defined as the diameter of a cylinder containing 90% of the X-ray production) in microscopes fitted with different electron sources and operating at different voltages in foils of various elements. Application of these calculations to the special case of detecting monolayer segregation at grain boundaries results in predictions of the minimum amounts of such segregation that would be observable. It is found, for example, that in a microscope with a field-emission source operating at 500 keV, resolution of < 1 nm is obtainable in an iron foil 20 nm thick, and in this case about 0.001 monolayer of chromium is detectable segregated at grain boundaries. The calculations do not take into account instrumental or experimental problems such as specimen drift, specimen preparation, etc., and represent the basic physical limits of performance of a perfect analytical microscope.
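    The beam-broadening ingredient of these calculations is often approximated by the single-scattering expression associated with Goldstein and co-workers, b = 6.25e5 (Z/E0) (rho/A)^(1/2) t^(3/2), with b and t in cm and E0 in eV. The sketch below uses that formula as an assumption; it is non-relativistic and covers only one term of the full resolution budget described above, so treat the numbers as indicative.

```python
import math

def beam_broadening_nm(Z, A, rho_g_cm3, E0_keV, t_nm):
    """Single-scattering beam-broadening estimate (sketch):
    b = 6.25e5 * (Z / E0[eV]) * sqrt(rho / A) * t^(3/2), b and t in cm.
    Z: atomic number, A: atomic weight, rho: density in g/cm^3."""
    t_cm = t_nm * 1e-7
    E0_eV = E0_keV * 1e3
    b_cm = 6.25e5 * (Z / E0_eV) * math.sqrt(rho_g_cm3 / A) * t_cm ** 1.5
    return b_cm * 1e7   # convert back to nm

# the abstract's example case: iron foil, 20 nm thick, 500 keV
b_fe = beam_broadening_nm(Z=26, A=55.85, rho_g_cm3=7.87, E0_keV=500.0, t_nm=20.0)
```

    Reassuringly, this gives sub-nanometre broadening for the 20 nm iron foil at 500 keV, consistent with the < 1 nm resolution quoted in the abstract; the broadening grows as t^(3/2), which is why the optimum specimen thickness is finite.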